120 research outputs found

    A Compact Linear Programming Relaxation for Binary Sub-modular MRF

    Full text link
    We propose a novel compact linear programming (LP) relaxation for binary sub-modular MRF in the context of object segmentation. Our model is obtained by linearizing an ℓ₁⁺-norm derived from the quadratic programming (QP) form of the MRF energy. The resultant LP model contains significantly fewer variables and constraints compared to the conventional LP relaxation of the MRF energy. In addition, unlike QP, which can produce ambiguous labels, our model can be viewed as a quasi-total-variation minimization problem, and it can therefore preserve the discontinuities in the labels. We further establish a relaxation bound between our LP model and the conventional LP model. In the experiments, we demonstrate our method on the task of interactive object segmentation. Our LP model outperforms QP when converting the continuous labels to binary labels using different threshold values on the entire Oxford interactive segmentation dataset. The computational complexity of our LP is of the same order as that of the QP, and it is significantly lower than that of the conventional LP relaxation.
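The paper's compact LP is not spelled out in the abstract, but the conventional LP relaxation it is compared against can be sketched directly. The example below is a minimal, hypothetical instance (NumPy/SciPy assumed, all numbers invented): the binary sub-modular MRF energy on a 4-node chain is relaxed by letting each label range over [0, 1] and introducing one auxiliary variable per edge to model the absolute label difference; for this toy instance the relaxation returns integral labels.

```python
import numpy as np
from scipy.optimize import linprog

# Toy binary MRF on a 4-node chain (invented numbers, not from the paper):
#   E(x) = sum_i theta_i x_i + sum_(i,j) w |x_i - x_j|,  x_i in {0,1}
theta = np.array([-1.0, -0.2, 0.3, 1.0])   # unary costs (negative favours label 1)
edges = [(0, 1), (1, 2), (2, 3)]
w = 0.5                                     # submodular pairwise weight on every edge

n, m = len(theta), len(edges)
# Variables: x_0..x_3 relaxed to [0, 1], followed by z_e >= |x_i - x_j| per edge.
c = np.concatenate([theta, w * np.ones(m)])

# Encode z_e >= x_i - x_j and z_e >= x_j - x_i as A_ub @ v <= 0.
A_ub = np.zeros((2 * m, n + m))
for k, (i, j) in enumerate(edges):
    A_ub[2 * k, [i, j, n + k]] = [1, -1, -1]
    A_ub[2 * k + 1, [i, j, n + k]] = [-1, 1, -1]
bounds = [(0, 1)] * n + [(0, None)] * m

res = linprog(c, A_ub=A_ub, b_ub=np.zeros(2 * m), bounds=bounds)
labels = (res.x[:n] > 0.5).astype(int)
print(labels, round(res.fun, 3))   # integral optimum: [1 1 0 0], energy -0.7
```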

    Improving the achievements of non-traditional students on computing courses at one wide access university

    Get PDF
    This longitudinal study set out to improve the retention and achievements of diverse students on computing courses in one wide access university, firstly by early identification of students at risk of poor performance and secondly by developing and implementing an intervention programme. Qualitative data were obtained using the ASSIST questionnaire, by focus group discussions and an open-ended questionnaire on students’ experiences of the transition to higher education (HE). Quantitative data on student characteristics and module results were obtained from Registry. Statistical analyses were performed using SPSS version 10. The study comprised two phases where phase one sought to enable the early detection of students at risk of poor performance by investigating the data set for patterns that may emerge between student achievement at Level 1 and entrance qualification, feeder institution, approaches to learning, conceptions of learning, course and teaching preferences and motivation. Phase one findings showed a trend of poorer performance by students who entered computing courses in HE with an AVCE entrance qualification. It was also shown that mature students scored more highly on the deep approach scale compared to their younger counterparts. Phase two investigated the data set for patterns that may emerge between student achievement at Level 2 and entrance qualification, approaches to learning, conceptions of learning and course and teaching preferences. Phase two, using action research, also sought to develop an intervention programme from the findings. This intervention programme was designed to improve aspects of information delivery to students; the personal tutor system, assessment régimes, Welcome Week, and teaching and learning. Piloting, evaluation and refinement of the intervention programme brought changes that were seen as positive by both staff and students. 
These changes included the Welcome Week Challenge which involved students in activities that sought to enhance students’ interactions with peers, personal tutors and the school and university facilities. These findings have shown that, for staff in wide access HE institutions, some knowledge of the previous educational experiences of their students, and the requirements of those students, are vital in providing a smooth transition to HE. A model of the characteristics of a successful student on computing courses in HE and a model for enhanced retention of diverse students on computing courses in HE were developed from the research findings. These models provide a significant contribution to current knowledge of those factors that enhance a smooth transition to HE and the characteristics of a successful student in a wide access university.

    Robust Obstacle Detection based on Dense Disparity Maps

    Get PDF
    Obstacle detection is an important component for many autonomous vehicle navigation systems. Several methods for obstacle detection have been proposed using various active sensors such as radar, sonar and laser range finders. Vision-based techniques have the advantage of low cost and provide a large amount of information about the environment around an intelligent vehicle. This paper deals with the development of an accurate and efficient vision-based obstacle detection method which relies on wavelet analysis. The developed system will be integrated on the Cybercar platform, a road vehicle with fully automated driving capabilities.
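The abstract does not say which wavelet is used, so as a hedged illustration only, here is a single-level 1-D Haar decomposition (the simplest wavelet): its high-pass coefficients respond exactly at intensity discontinuities, such as an obstacle boundary crossing an image row. The signal values below are invented.

```python
import numpy as np

def haar_step(x):
    """One level of the orthonormal Haar transform of an even-length signal."""
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass: local averages
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass: local differences
    return approx, detail

def haar_inverse(approx, detail):
    """Exact inverse of haar_step (the transform is orthonormal)."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

# An invented image row with a sharp step (e.g. road surface -> obstacle edge).
row = np.array([10.0, 10.0, 10.0, 40.0, 40.0, 40.0, 40.0, 40.0])
a, d = haar_step(row)
print(d)   # |detail| is large only at the pair containing the intensity jump
```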

    Motion corrected 3D reconstruction of the fetal thorax from prenatal MRI

    Get PDF
    In this paper we present a semi-automatic method for analysis of the fetal thorax in genuine three-dimensional volumes. After one initial click we localize the spine and accurately determine the volume of the fetal lung from high-resolution volumetric images reconstructed from motion-corrupted prenatal Magnetic Resonance Imaging (MRI). We compare the current state-of-the-art method of segmenting the lung in a slice-by-slice manner with the most recent multi-scan reconstruction methods. We use fast rotation-invariant spherical harmonics image descriptors with Classification Forest ensemble learning methods to extract the spinal cord and show an efficient way to generate a segmentation prior for the fetal lung from this information for two different MRI field strengths. The spinal cord can be segmented with a DICE coefficient of 0.89 and the automatic lung segmentation has been evaluated with a DICE coefficient of 0.87. We evaluate our method on 29 fetuses with a gestational age (GA) between 20 and 38 weeks and show that our computed segmentations and the manual ground truth correlate well with values recorded in the literature.

    Iterative algorithms for total variation-like reconstructions in seismic tomography

    Full text link
    A qualitative comparison of total variation-like penalties (total variation, the Huber variant of total variation, total generalized variation, ...) is made in the context of global seismic tomography. Both penalized and constrained formulations of seismic recovery problems are treated. A number of simple iterative recovery algorithms applicable to these problems are described. The convergence speed of these algorithms is compared numerically in this setting. For the constrained formulation a new algorithm is proposed and its convergence is proven. Comment: 28 pages, 8 figures. Corrected sign errors in formula (25).
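As a sketch of the penalized formulation with the Huber variant of total variation (which, unlike plain TV, is differentiable, so ordinary gradient descent applies), the toy 1-D denoising problem below stands in for the seismic recovery problem; the weight, the Huber threshold and the step size are assumptions, not values from the paper.

```python
import numpy as np

def huber_grad(t, delta):
    """Derivative of the Huber function: linear in t near 0, saturated beyond delta."""
    return np.clip(t, -delta, delta)

def denoise_huber_tv(b, lam=1.0, delta=0.05, step=0.2, iters=500):
    """Gradient descent on 0.5*||x - b||^2 + lam * sum_i huber(x[i+1] - x[i])."""
    x = b.copy()
    for _ in range(iters):
        d = np.diff(x)                        # forward differences D x
        g = x - b                             # gradient of the data-fit term
        h = lam * huber_grad(d, delta)        # huber'(D x)
        g[:-1] -= h                           # plus D^T huber'(D x), split into
        g[1:] += h                            # its two banded contributions
        x = x - step * g
    return x

# Invented piecewise-constant "model" with additive noise (fixed seed).
rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0, 0.0], 30)
noisy = clean + 0.1 * rng.standard_normal(clean.size)
rec = denoise_huber_tv(noisy)   # smoother than noisy, jumps preserved
```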

    Solving the Uncalibrated Photometric Stereo Problem using Total Variation

    Get PDF
    In this paper we propose a new method to solve the problem of uncalibrated photometric stereo, making very weak assumptions on the properties of the scene to be reconstructed. Our goal is to solve the generalized bas-relief ambiguity (GBR) by performing a total variation regularization of both the estimated normal field and albedo. Unlike most of the previous attempts to solve this ambiguity, our approach does not rely on any prior information about the shape or the albedo, apart from its piecewise smoothness. We test our method on real images and obtain results comparable to the state-of-the-art algorithms.

    Implementation of an Optimal First-Order Method for Strongly Convex Total Variation Regularization

    Get PDF
    We present a practical implementation of an optimal first-order method, due to Nesterov, for large-scale total variation regularization in tomographic reconstruction, image deblurring, etc. The algorithm applies to μ-strongly convex objective functions with L-Lipschitz continuous gradient. In the framework of Nesterov both μ and L are assumed known -- an assumption that is seldom satisfied in practice. We propose to incorporate mechanisms to estimate locally sufficient μ and L during the iterations. The mechanisms also allow for the application to non-strongly convex functions. We discuss the iteration complexity of several first-order methods, including the proposed algorithm, and we use a 3D tomography problem to compare the performance of these methods. The results show that for ill-conditioned problems solved to high accuracy, the proposed method significantly outperforms state-of-the-art first-order methods, as also suggested by theoretical results. Comment: 23 pages, 4 figures.
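The abstract's starting point, Nesterov's optimal method with both μ and L known, can be sketched as below on an ill-conditioned quadratic; the paper's mechanisms for estimating μ and L during the iterations are not reproduced here, and the test problem is invented.

```python
import numpy as np

def nesterov_strongly_convex(grad, x0, mu, L, iters=1000):
    """Nesterov's optimal method for a mu-strongly convex, L-smooth objective,
    with both constants assumed known (the setting the paper relaxes)."""
    q = np.sqrt(mu / L)
    beta = (1 - q) / (1 + q)       # = (sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu))
    x, y = x0.copy(), x0.copy()
    for _ in range(iters):
        x_next = y - grad(y) / L   # gradient step from the extrapolated point
        y = x_next + beta * (x_next - x)
        x = x_next
    return x

# Ill-conditioned quadratic test problem: f(x) = 0.5 x^T A x - b^T x,
# with A diagonal so mu = 1 and L = 100 are known exactly (invented data).
rng = np.random.default_rng(1)
A = np.diag(np.linspace(1.0, 100.0, 20))
b = rng.standard_normal(20)
x = nesterov_strongly_convex(lambda v: A @ v - b, np.zeros(20), mu=1.0, L=100.0)
x_star = np.linalg.solve(A, b)     # exact minimizer for comparison
```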

    Templates for Convex Cone Problems with Applications to Sparse Signal Recovery

    Full text link
    This paper develops a general framework for solving a variety of convex cone problems that frequently arise in signal processing, machine learning, statistics, and other fields. The approach works as follows: first, determine a conic formulation of the problem; second, determine its dual; third, apply smoothing; and fourth, solve using an optimal first-order method. A merit of this approach is its flexibility: for example, all compressed sensing problems can be solved via this approach. These include models with objective functionals such as the total-variation norm, ||Wx||_1 where W is arbitrary, or a combination thereof. In addition, the paper also introduces a number of technical contributions such as a novel continuation scheme, a novel approach for controlling the step size, and some new results showing that the smoothed and unsmoothed problems are sometimes formally equivalent. Combined with our framework, these lead to novel, stable and computationally efficient algorithms. For instance, our general implementation is competitive with state-of-the-art methods for solving intensively studied problems such as the LASSO. Further, numerical experiments show that one can solve the Dantzig selector problem, for which no efficient large-scale solvers exist, in a few hundred iterations. Finally, the paper is accompanied with a software release. This software is not a single, monolithic solver; rather, it is a suite of programs and routines designed to serve as building blocks for constructing complete algorithms. Comment: The TFOCS software is available at http://tfocs.stanford.edu This version has updated references.
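The TFOCS machinery itself (conic dualization plus smoothing) is too large to sketch here, but its ℓ₁ building block, the soft-thresholding operator, is the proximal map behind simple first-order LASSO solvers. Below is a minimal proximal-gradient (ISTA) solver for the LASSO, one of the benchmark problems named in the abstract; this is not the paper's algorithm, and the problem sizes and λ are invented.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1: elementwise shrinkage toward zero."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(A, b, lam, iters=2000):
    """Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - b)              # gradient of the data-fit term
        x = soft_threshold(x - g / L, lam / L)
    return x

# Invented sparse recovery instance: 3-sparse signal, noiseless measurements.
rng = np.random.default_rng(2)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.5, -2.0, 1.0]
b = A @ x_true
x_hat = ista_lasso(A, b, lam=0.1)          # recovers the support of x_true
```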